# Korean Pretraining
## BERT Base
A Korean-pretrained BERT model developed by the KLUE benchmark team, supporting a variety of Korean language understanding tasks.

- Category: Large Language Model
- Tags: Transformers, Korean
- Maintainer: klue
- Downloads: 129.68k · Likes: 47
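As a quick illustration, here is a minimal sketch of loading this checkpoint with the Hugging Face `transformers` library. The model ID `klue/bert-base` is an assumption inferred from the maintainer and model name above, and the sample sentence is hypothetical.

```python
# Minimal sketch: encode a Korean sentence with the KLUE BERT checkpoint.
# The model ID "klue/bert-base" is inferred from the listing, not confirmed by it.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("klue/bert-base")
model = AutoModel.from_pretrained("klue/bert-base")

# Tokenize a sample sentence ("This is a Korean sentence.") and run the encoder.
inputs = tokenizer("이것은 한국어 문장입니다.", return_tensors="pt")
outputs = model(**inputs)

# Contextual token embeddings: (batch, sequence_length, hidden_size).
print(outputs.last_hidden_state.shape)
```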
## RoBERTa Base
A RoBERTa model pretrained on Korean, suitable for a range of Korean natural language processing tasks.

- Category: Large Language Model
- Tags: Transformers, Korean
- Maintainer: klue
- Downloads: 1.2M · Likes: 33
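Checkpoints like this are usually applied to downstream tasks by attaching a task head and fine-tuning. The sketch below shows the shape of that setup; the model ID `klue/roberta-base`, the two-label task, and the sample text are all assumptions for illustration.

```python
# Minimal sketch: attach a (randomly initialized) classification head to the
# KLUE RoBERTa checkpoint. The model ID "klue/roberta-base" is inferred from
# the listing; the head is untrained, so real use requires fine-tuning.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("klue/roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "klue/roberta-base", num_labels=2  # hypothetical binary task
)

inputs = tokenizer("이 영화 정말 재미있어요!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Class probabilities from the untrained head (meaningful only after fine-tuning).
print(logits.softmax(dim=-1))
```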
## KoELECTRA Base v3 Generator
KoELECTRA v3 is a Korean pretrained language model based on the ELECTRA architecture, particularly well suited to Korean text processing tasks.

- License: Apache-2.0
- Category: Large Language Model
- Tags: Transformers, Korean
- Maintainer: monologg
- Downloads: 3,003 · Likes: 6
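Because an ELECTRA generator is trained with a masked-language-modeling objective, it can be probed directly through a fill-mask pipeline. A minimal sketch follows; the model ID `monologg/koelectra-base-v3-generator` is inferred from the listing, and the masked sentence is hypothetical. The same pattern applies to the base generator listed below.

```python
# Minimal sketch: query the ELECTRA generator as a masked language model.
# The model ID is inferred from the listing; the sentence is hypothetical.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="monologg/koelectra-base-v3-generator")

# "This is a Korean [MASK] model." — print the top predicted fill-ins.
for prediction in fill_mask("이것은 한국어 [MASK] 모델입니다."):
    print(prediction["token_str"], prediction["score"])
```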
## KoELECTRA Base Generator
KoELECTRA is a Korean pretrained language model based on the ELECTRA architecture, developed by monologg. This checkpoint is the generator component, focused on representation learning for Korean text.

- License: Apache-2.0
- Category: Large Language Model
- Tags: Transformers, Korean
- Maintainer: monologg
- Downloads: 31 · Likes: 0